# Commuting matrices have a common eigenvector - A proof of FTA

We will establish the following theorem using linear algebra:

> **Theorem.**
> If $A_1,A_2,\ldots,A_k$ is any collection of $k$ square matrices over $\mathbb{C}$ that pairwise commute, then they share a common eigenvector $v$.

As a corollary, we will obtain:

> **Corollary. Fundamental theorem of algebra.**
> Any nonconstant complex polynomial $p(z)$ has at least one root in $\mathbb{C}$.

We will prove the main theorem without using FTA, thereby giving a linear algebra proof of FTA. The proof is due to Derksen[^1]; this set of problems just guides you through Derksen's argument. Feel free to look it up!

![[---images/---assets/---icons/question-icon.svg]] Show this theorem implies the fundamental theorem of algebra. Hint: Take a nonconstant complex polynomial $p$, and consider its corresponding companion matrix.

To prove the main theorem, we will proceed in several steps. Let $P(k,d,\mathbb{F})$ denote the following statement:

> $P(k,d,\mathbb{F})=$
> Suppose $A_1,\ldots,A_k$ are $k$ pairwise commuting $n\times n$ matrices over the field $\mathbb{F}$, where $d$ does not divide $n$. Then they share a common eigenvector.

We would like to show that $P(k,d,\mathbb{C})$ is true for all $k$ and all $d$.

![[---images/---assets/---icons/question-icon.svg]] It suffices to show $P(k,2^\ell,\mathbb{C})$ is true for all $k$ and $\ell$. Why?

Let us build up to this.

![[---images/---assets/---icons/question-icon.svg]] Show that if $P(1,d,\mathbb{F})$ is true, then $P(k,d,\mathbb{F})$ is true for all $k$. Hint: Suppose $P(k',d,\mathbb{F})$ is true for all $k'<k$. To show $P(k,d,\mathbb{F})$ is true, do induction on $n$, the dimension of the space these matrices act on. The case $n=1$ is trivial. Now let $A_1,\ldots,A_k$ be $k$ commuting $n\times n$ matrices over $\mathbb{F}$, with $d\nmid n$. Use the fact that $P(1,d,\mathbb{F})$ is true, so that $A_k$ has some eigenvalue $\lambda$.
Now consider the spaces $K=\ker(A_k-\lambda I)$ and $W=\operatorname{im}(A_k-\lambda I)$. Observe that $A_1,\ldots,A_{k-1}$ all leave $K$ and $W$ invariant. Since $\dim K+\dim W=n$ and $d\nmid n$, the number $d$ fails to divide the dimension of at least one of $K,W$. Conclude that the matrices all share a common eigenvector on a possibly smaller dimensional space.

![[---images/---assets/---icons/question-icon.svg]] Show $P(k,2,\mathbb{R})$ is true for all $k$.

![[---images/---assets/---icons/question-icon.svg]] Show $P(1,2,\mathbb{C})$ is true, and hence $P(k,2,\mathbb{C})$ is true for all $k$. Hint: To do this, take an $n\times n$ matrix $A$ over $\mathbb{C}$. Consider the linear maps $L_1,L_2:\operatorname{Herm}_n(\mathbb{C})\to\operatorname{Herm}_n(\mathbb{C})$ given by $L_1(X)=\frac{1}{2}(AX+XA^\ast)$ and $L_2(X)=\frac{1}{2i}(AX-XA^\ast)$. Show $L_1$ and $L_2$ share a common eigenvector $M$ (which is a matrix in $\operatorname{Herm}_n(\mathbb{C})$). Now compute what $(L_1+iL_2)M$ is, to show $A$ has an eigenvector.

![[---images/---assets/---icons/question-icon.svg]] Show $P(1,2^\ell,\mathbb{C})$ is true for all $\ell$, and hence $P(k,2^\ell,\mathbb{C})$ is true. Hint: To do this, take an $n\times n$ matrix $A$ over $\mathbb{C}$. Consider the linear maps $L_1,L_2:\operatorname{Skew}_n(\mathbb{C})\to\operatorname{Skew}_n(\mathbb{C})$ given by $L_1(X)=AX-XA^T$ and $L_2(X)=AXA^T$. Show $L_1$ and $L_2$ have a common eigenvector $M$. Now, using the facts that $L_1(M)=\lambda M$ and $L_2(M)=\mu M$, show that $(A^2-\lambda A-\mu I)M=0$. Conclude that we have an eigenvector for $A$.

---

[^1]: Harm Derksen, *The fundamental theorem of algebra and linear algebra*, American Mathematical Monthly 110 (2003).
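As a numerical aside on the companion-matrix hint: the eigenvalues of the companion matrix of a monic polynomial $p$ are exactly the roots of $p$, which is why a common-eigenvector theorem yields FTA. A quick sanity check with numpy (the polynomial $p(z)=z^3-6z^2+11z-6=(z-1)(z-2)(z-3)$ is my own example, not from the text):

```python
import numpy as np

def companion(coeffs):
    """Companion matrix of the monic polynomial
    z^n + c[n-1] z^(n-1) + ... + c[1] z + c[0],
    where `coeffs` lists c[0], ..., c[n-1] (lowest degree first)."""
    n = len(coeffs)
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)       # ones on the subdiagonal
    C[:, -1] = -np.asarray(coeffs)   # last column carries minus the coefficients
    return C

# p(z) = z^3 - 6z^2 + 11z - 6, whose roots are 1, 2, 3
C = companion([-6.0, 11.0, -6.0])
roots = np.sort(np.linalg.eigvals(C).real)
print(roots)  # approximately 1, 2, 3, up to rounding
```

An eigenvector of $C$ with eigenvalue $z_0$ is (up to scaling) $(1,z_0,\ldots,z_0^{n-1})^T$, which forces $p(z_0)=0$.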
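For the Hermitian-maps hint, the computation of $(L_1+iL_2)M$ rests on the identity $(L_1+iL_2)(X)=AX$ for every $X$, and on $L_1(X),L_2(X)$ being Hermitian whenever $X$ is. A numerical spot check (the random matrices and seed are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# A random Hermitian test matrix X = B + B^*
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
X = B + B.conj().T

L1X = 0.5 * (A @ X + X @ A.conj().T)   # L1(X) = (AX + XA*)/2
L2X = (A @ X - X @ A.conj().T) / 2j    # L2(X) = (AX - XA*)/(2i)

# Both images stay inside Herm_n(C) ...
assert np.allclose(L1X, L1X.conj().T)
assert np.allclose(L2X, L2X.conj().T)
# ... and the combination L1 + i L2 recovers left multiplication by A:
assert np.allclose(L1X + 1j * L2X, A @ X)
```

So if $M$ is a common eigenvector of $L_1$ and $L_2$, then $AM=(L_1+iL_2)M$ is a scalar multiple of $M$, which is the bridge to an eigenvector of $A$.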